On the Semantic Relationship between Probabilistic Soft Logic and Markov Logic

Authors

  • Joohyung Lee
  • Yi Wang
Abstract

Markov Logic Networks (MLN) and Probabilistic Soft Logic (PSL) are widely applied formalisms in statistical relational learning, an emerging area in Artificial Intelligence that is concerned with combining logical and statistical AI. Despite their resemblance, the relationship between them has not been formally stated. In this paper, we describe the precise semantic relationship between them from a logical perspective. This is facilitated by first extending fuzzy logic to allow weights, which can also be viewed as a generalization of PSL, and then relating that generalization to MLN. We observe that the relationship between PSL and MLN is analogous to the known relationship between fuzzy logic and Boolean logic, and furthermore that the weight scheme of PSL is essentially a generalization of the weight scheme of MLN to the many-valued setting.

Introduction

Statistical relational learning (SRL) is an emerging area in Artificial Intelligence that is concerned with combining logical and statistical AI. Markov Logic Networks (MLN) (Richardson and Domingos 2006) and Probabilistic Soft Logic (PSL) (Kimmig et al. 2012; Bach et al. 2015) are well-known formalisms in statistical relational learning, and have been successfully applied to a wide range of AI applications, such as natural language processing, entity resolution, collective classification, and social network modeling. Both combine logic and probabilistic graphical models in a single representation, where each formula is associated with a weight, and the probability distribution over possible worlds is derived from the weights of the formulas satisfied by those worlds. However, despite their resemblance to each other, the precise relationship between their semantics is not obvious. PSL is based on fuzzy interpretations that range over reals in [0, 1], and in this sense is more general than MLN. On the other hand, its syntax is restricted to formulas in clausal form, unlike MLN, which allows arbitrary complex formulas.
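As an illustration of the MLN weight scheme referred to above (a minimal sketch, not code from the paper; the atoms, formulas, and weight values are invented for this example): the unnormalized weight of a Boolean world is the exponentiated sum of the weights of the formulas it satisfies, and normalizing over all worlds yields the probability distribution.

```python
import math

def mln_weight(world, weighted_formulas):
    """Unnormalized MLN weight of a Boolean world:
    exp of the sum of weights of the formulas it satisfies."""
    return math.exp(sum(w for w, f in weighted_formulas if f(world)))

# Hypothetical example with two atoms p, q and two weighted formulas:
#   1.5 : p
#   0.8 : p -> q
formulas = [
    (1.5, lambda I: I["p"]),
    (0.8, lambda I: (not I["p"]) or I["q"]),
]

# Enumerate all Boolean worlds and normalize to get probabilities.
worlds = [{"p": a, "q": b} for a in (True, False) for b in (True, False)]
Z = sum(mln_weight(I, formulas) for I in worlds)   # partition function
prob = {(I["p"], I["q"]): mln_weight(I, formulas) / Z for I in worlds}
```

For instance, the world where both `p` and `q` are true satisfies both formulas, so its unnormalized weight is exp(1.5 + 0.8).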
It is also not obvious how the weights of their models are related to each other, due to the different ways the weights are associated with models. Originating from machine learning research, these formalisms are equipped with several efficient inference and learning algorithms, and some papers compare the suitability of one formalism over the other through experiments on specific applications (Beltagy, Erk, and Mooney 2014). On the other hand, the precise relationship between the two formalisms has not been formally stated.

In this paper, we present a precise semantic relationship between them. We observe that the relationship is analogous to the well-known relationship between fuzzy logic and classical logic. Moreover, despite the different ways that weights of models are defined in each formalism, it turns out that they are essentially of the same kind. Towards this end, we introduce a weighted fuzzy logic as a proper generalization of PSL, which is also interesting in its own right as an extension of standard fuzzy logic that incorporates weighted models. The weighted fuzzy logic uses the same weight scheme as PSL, but associates weights with arbitrary fuzzy formulas. This intermediate formalism facilitates the comparison between PSL and MLN. We observe that the analogy between fuzzy logic and Boolean logic carries over to PSL and MLN: just as fuzzy logic agrees with Boolean logic on crisp interpretations, PSL and MLN agree on crisp interpretations, where their weights are proportional to each other. However, their maximum a posteriori (MAP) estimates do not necessarily coincide, due to the difference between many-valued and Boolean models.

The paper is organized as follows. We first review each of MLN, fuzzy propositional logic, and PSL. Then we define a weighted fuzzy logic as a generalization of PSL. Using this, we study the semantic relationship between PSL and MLN.

In Working Notes of the 6th International Workshop on Statistical Relational AI (StarAI 2016).
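To make the agreement on crisp interpretations concrete, here is a small sketch (an illustration, not code from the paper) of the Łukasiewicz connectives commonly used in PSL. On truth values restricted to {0, 1} they coincide with the Boolean connectives, which is the analogy between fuzzy and Boolean logic mentioned above; the `distance_to_satisfaction` helper is a hypothetical name for the PSL-style measure of how far a clause body → head is from being satisfied.

```python
def l_neg(a):            # Lukasiewicz negation
    return 1.0 - a

def l_and(a, b):         # Lukasiewicz t-norm (strong conjunction)
    return max(0.0, a + b - 1.0)

def l_or(a, b):          # Lukasiewicz t-conorm (strong disjunction)
    return min(1.0, a + b)

# On crisp truth values {0, 1} the fuzzy connectives coincide with
# the Boolean ones -- the agreement noted in the text.
for a in (0.0, 1.0):
    for b in (0.0, 1.0):
        assert l_and(a, b) == float(bool(a) and bool(b))
        assert l_or(a, b) == float(bool(a) or bool(b))
        assert l_neg(a) == float(not a)

# PSL-style distance to satisfaction of a rule body -> head:
# the rule is fully satisfied when the truth value of the head
# is at least that of the body, and the penalty grows linearly
# with the shortfall.
def distance_to_satisfaction(body_value, head_value):
    return max(0.0, body_value - head_value)
```

On fractional truth values the connectives diverge from Boolean behavior (e.g. `l_and(0.7, 0.6)` is 0.3, not 0 or 1), which is the root of the MAP-estimate discrepancy discussed in the paper.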

Similar Articles

Probabilistic Soft Logic for Semantic Textual Similarity

Probabilistic Soft Logic (PSL) is a recently developed framework for probabilistic logic. We use PSL to combine logical and distributional representations of natural-language meaning, where distributional information is represented in the form of weighted inference rules. We apply this framework to the task of Semantic Textual Similarity (STS) (i.e. judging the semantic similarity of naturallan...


Semantic Parsing using Distributional Semantics and Probabilistic Logic

We propose a new approach to semantic parsing that is not constrained by a fixed formal ontology and purely logical inference. Instead, we use distributional semantics to generate only the relevant part of an on-the-fly ontology. Sentences and the on-the-fly ontology are represented in probabilistic logic. For inference, we use probabilistic logic frameworks like Markov Logic Networks (MLN) and...


A Design Methodology for Reliable MRF-Based Logic Gates

Probabilistic methods have recently been used for designing noise-tolerant circuits. In these methods, however, there is no reliability mechanism, which is essential for nanometer digital VLSI circuits. In this paper, we propose a novel method for designing reliable probabilistic logic gates. The advantage of the proposed method in comparison with previous probabilistic-based met...


Inferring User Preferences by Probabilistic Logical Reasoning over Social Networks

We propose a framework for inferring the latent attitudes or preferences of users by performing probabilistic first-order logical reasoning over the social network graph. Our method answers questions about Twitter users like "Does this user like sushi?" or "Is this user a New York Knicks fan?" by building a probabilistic model that reasons over user attributes (the user's location or gender) and th...


Natural Language Semantics using Probabilistic Logic

With better natural-language semantic representations, computers can perform more applications more efficiently through a better understanding of natural text. However, no single semantic representation at this time fulfills all the requirements for a satisfactory representation. Logic-based representations like first-order logic capture many of the linguistic phenomena using logical constru...



Journal:
  • CoRR

Volume: abs/1606.08896  Issue: -

Pages: -

Publication date: 2016